Information-theoretic privacy in federated submodel learning

Authors

Abstract

We consider information-theoretic privacy in federated submodel learning, where a global server holds multiple submodels. Compared to conventional federated learning, in which secure aggregation is adopted to ensure privacy, the considered setting provides stronger protection on the submodel selection made by the local machine. We propose an achievable scheme that partially adopts private information retrieval (PIR) and achieves the minimum amount of download. With respect to computation and communication overhead, we compare the proposed scheme with a naïve approach to privacy.
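
The scheme described above builds on private information retrieval. As a rough illustration of how PIR lets a local machine hide which submodel it selects, the following is a minimal Python sketch of the classic two-server XOR-based PIR idea, under the assumption of two replicated, non-colluding servers; it is not the construction proposed in this paper, and the Server class and pir_download helper are hypothetical names used only for illustration.

    import secrets

    class Server:
        """One of two replicas holding all K submodels as equal-length byte strings."""
        def __init__(self, submodels):
            self.submodels = submodels

        def answer(self, index_set):
            """Return the bitwise XOR of the submodels whose indices are in index_set."""
            out = bytearray(len(self.submodels[0]))
            for i in index_set:
                for j, b in enumerate(self.submodels[i]):
                    out[j] ^= b
            return bytes(out)

    def pir_download(theta, num_submodels, server_a, server_b):
        """Privately download submodel `theta`: each server sees only a uniformly
        random index set, so neither server alone learns theta."""
        set_a = {i for i in range(num_submodels) if secrets.randbits(1)}
        set_b = set_a ^ {theta}  # differs from set_a exactly in theta
        ans_a = server_a.answer(set_a)
        ans_b = server_b.answer(set_b)
        # XOR of the two answers cancels all common indices, leaving submodel theta.
        return bytes(x ^ y for x, y in zip(ans_a, ans_b))

    # Example: four 8-byte submodels replicated on two servers.
    submodels = [secrets.token_bytes(8) for _ in range(4)]
    assert pir_download(2, 4, Server(submodels), Server(submodels)) == submodels[2]

Each query set is uniformly random on its own, so the selected index is information-theoretically hidden from either server alone; the paper's contribution concerns achieving such privacy with minimum download while accounting for computation and communication overhead.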


Similar resources

Information-theoretic metrics for security and privacy

In this thesis, we study problems in cryptography, privacy and estimation through the information-theoretic lens. We introduce information-theoretic metrics and associated results that shed light on the fundamental limits of what can be learned from noisy data. These metrics and results, in turn, are used to evaluate and design both symmetric-key encryption schemes and privacy-assuring mappings...


An information theoretic approach for privacy metrics

Organizations often need to release microdata without revealing sensitive information. To this end, data are anonymized and, to assess the quality of the process, various privacy metrics have been proposed, such as k-anonymity, ℓ-diversity, and t-closeness. These metrics are able to capture different aspects of the disclosure risk, imposing minimal requirements on the association of an indivi...
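
As a concrete reminder of what the first of these metrics requires (a hypothetical sketch, not code from the cited work), k-anonymity asks that every combination of quasi-identifier values occur in at least k released records:

    from collections import Counter

    def is_k_anonymous(rows, quasi_identifiers, k):
        """True if every quasi-identifier combination appears in at least k rows."""
        counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
        return min(counts.values()) >= k

    # Hypothetical generalized microdata; zip and age are the quasi-identifiers.
    rows = [
        {"zip": "021**", "age": "20-29", "disease": "flu"},
        {"zip": "021**", "age": "20-29", "disease": "cold"},
        {"zip": "021**", "age": "30-39", "disease": "flu"},
        {"zip": "021**", "age": "30-39", "disease": "asthma"},
    ]
    print(is_k_anonymous(rows, ["zip", "age"], k=2))  # True: each group has two rows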


Information-Theoretic Privacy with General Distortion Constraints

The privacy-utility tradeoff problem is formulated as determining the privacy mechanism (random mapping) that minimizes the mutual information (a metric for privacy leakage) between the private features of the original dataset and a released version. The minimization is studied with two types of constraints on the distortion between the public features and the released version of the dataset: (...
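
In standard notation (the symbol names below are my own, not necessarily those of the cited paper), with S the private features, X the public features, Y the released version, and d a distortion measure with budget D, one common way to write this formulation is

    \min_{P_{Y \mid X}} \; I(S; Y) \quad \text{subject to} \quad \mathbb{E}\big[ d(X, Y) \big] \le D,

so the optimal mechanism is the random mapping that leaks the least information about S while keeping the release Y within distortion D of X.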


Information-Theoretic Foundations of Differential Privacy

We examine the information-theoretic foundations of the increasingly popular notion of differential privacy. We establish a connection between differentially private mechanisms and the rate-distortion framework. Additionally, we show how differentially private distributions arise from the application of the Maximum Entropy Principle. This helps us locate differential privacy within the wid...
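
For reference, the standard ε-differential privacy condition being connected to rate-distortion here is (in the usual notation, not reproduced from the cited paper): a randomized mechanism M is ε-differentially private if, for all neighboring datasets D and D' and every measurable output set A,

    \Pr[\, M(D) \in A \,] \;\le\; e^{\varepsilon} \, \Pr[\, M(D') \in A \,].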


Information Theoretic Learning

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy INFORMATION THEORETIC LEARNING: RENYI’S ENTROPY AND ITS APPLICATIONS TO ADAPTIVE SYSTEM TRAINING By Deniz Erdogmus May 2002 Chairman: Dr. Jose C. Principe Major Department: Electrical and Computer Engineering Traditionally, second-order ...
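
The Rényi entropy named in the title is, in standard notation (not quoted from the dissertation), for a discrete distribution p and order \alpha > 0, \alpha \neq 1,

    H_\alpha(X) = \frac{1}{1 - \alpha} \log \sum_i p_i^{\alpha},

which recovers the Shannon entropy in the limit \alpha \to 1; the quadratic case \alpha = 2 is the one typically estimated nonparametrically in information-theoretic learning.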



Journal

Journal title: ICT Express

Year: 2023

ISSN: 2405-9595

DOI: https://doi.org/10.1016/j.icte.2022.02.008